Boosting on Manifolds: Adaptive Regularization of Base Classifiers
Authors
Abstract
In this paper we propose to combine two powerful ideas, boosting and manifold learning. On the one hand, we improve ADABOOST by incorporating knowledge of the structure of the data into base classifier design and selection. On the other hand, we use ADABOOST's efficient learning mechanism to significantly improve supervised and semi-supervised algorithms proposed in the context of manifold learning. Besides the specific manifold-based penalization, the resulting algorithm also accommodates the boosting of a large family of regularized learning algorithms.
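The core idea of boosting with regularized base classifiers can be illustrated with a minimal sketch: at each round, the base classifier is chosen to minimize weighted error plus a penalty term (which could encode, e.g., manifold smoothness). All names and the penalty interface below are illustrative, not the paper's notation:

```python
import numpy as np

def regularized_adaboost(X, y, stumps, penalty, n_rounds=10, lam=0.1):
    """AdaBoost variant where base-classifier selection trades off
    weighted error against a regularization penalty. `stumps` is a
    list of callables h: X -> {-1, +1}; `penalty(h)` scores a
    candidate's complexity (a manifold smoothness term would go here).
    Illustrative sketch, not the paper's algorithm."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # example weights
    ensemble = []
    for _ in range(n_rounds):
        # pick the base classifier minimizing penalized weighted error
        scores = [(w * (h(X) != y)).sum() + lam * penalty(h) for h in stumps]
        h = stumps[int(np.argmin(scores))]
        err = max((w * (h(X) != y)).sum(), 1e-12)
        if err >= 0.5:
            break
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * h(X))   # reweight misclassified examples up
        w /= w.sum()                     # renormalize
        ensemble.append((alpha, h))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of the selected base classifiers."""
    return np.sign(sum(a * h(X) for a, h in ensemble))
```

With `lam = 0` this reduces to standard AdaBoost; the penalty only changes which base classifier is selected each round, so any penalized learner that exposes this interface can be boosted the same way.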
Similar Papers
Boosting with Spatial Regularization
By adding a spatial regularization kernel to a standard loss function formulation of the boosting problem, we develop a framework for spatially informed boosting. From this regularized loss framework we derive an efficient boosting algorithm that uses additional weights/priors on the base classifiers. We prove that the proposed algorithm exhibits a “grouping effect”, which encourages the select...
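A spatially regularized boosting loss of this kind can be sketched as the usual exponential loss plus a quadratic smoothness penalty over base-classifier weights, where a graph Laplacian built from a spatial proximity kernel pulls neighbouring classifiers' coefficients together (one plausible reading of the "grouping effect"; names and the exact penalty form here are assumptions, not the paper's formulation):

```python
import numpy as np

def penalized_exp_loss(alpha, H, y, K, lam=0.1):
    """Exponential boosting loss plus a spatial smoothness penalty.
    H[:, j] holds base classifier j's predictions on the sample,
    alpha its coefficients, and K a kernel encoding spatial proximity
    between base classifiers. The Laplacian quadratic form equals
    0.5 * sum_ij K_ij (alpha_i - alpha_j)^2, so it penalizes weight
    differences between spatially adjacent classifiers. Illustrative
    sketch only."""
    margin = y * (H @ alpha)                 # per-example margins
    Lap = np.diag(K.sum(axis=1)) - K         # graph Laplacian over classifiers
    smooth = alpha @ Lap @ alpha             # smoothness penalty
    return np.exp(-margin).mean() + lam * smooth
```

Minimizing such a loss favours solutions where groups of nearby base classifiers receive similar weights rather than one classifier per region being selected in isolation.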
On the Rate of Convergence of Regularized Boosting Classifiers
A regularized boosting method is introduced, for which regularization is obtained through a penalization function. It is shown through oracle inequalities that this method is model adaptive. The rate of convergence of the probability of misclassification is investigated. It is shown that for quite a large class of distributions, the probability of error converges to the Bayes risk at a rate fas...
Convergence and Consistency of Regularized Boosting Algorithms with Stationary β-Mixing Observations
We study the statistical convergence and consistency of regularized Boosting methods, where the samples are not independent and identically distributed (i.i.d.) but come from empirical processes of stationary β-mixing sequences. Utilizing a technique that constructs a sequence of independent blocks close in distribution to the original samples, we prove the consistency of the composite classifi...
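The independent-block technique mentioned above can be sketched concretely: a dependent sequence is cut into alternating blocks, and for β-mixing data the blocks of one parity are close in distribution to an independent sample, which lets i.i.d.-style arguments go through. A minimal illustration (the function name and interface are hypothetical):

```python
def alternating_blocks(seq, block_len):
    """Split a sequence into alternating 'odd' and 'even' blocks of
    length block_len. For a stationary beta-mixing sequence, blocks of
    one parity are separated by gaps of length block_len, making them
    approximately independent. Illustrative sketch of the blocking
    construction, not the paper's exact scheme."""
    blocks = [seq[i:i + block_len] for i in range(0, len(seq), block_len)]
    return blocks[0::2], blocks[1::2]   # (odd-indexed, even-indexed) blocks
```

For example, a sequence of length 10 with `block_len = 2` yields odd blocks covering positions 0-1, 4-5, 8-9, each separated by a discarded gap of length 2.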
Signature Verification using Integrated Classifiers
This paper presents a new approach for off-line signature verification. The proposed system is based on global, grid, ink-distribution, and texture features. The Boosting algorithm is applied to train and integrate multiple classifiers, with a distance-based classifier used as the base classifier for each feature set. An adaptive threshold is associated with individuality. Experimenta...